Limited-Memory Reduced-Hessian Methods for Large-Scale Unconstrained Optimization

Authors

  • Philip E. Gill
  • Michael W. Leonard
Abstract

Limited-memory BFGS quasi-Newton methods approximate the Hessian matrix of second derivatives by the sum of a diagonal matrix and a fixed number of rank-one matrices. These methods are particularly effective for large problems in which the approximate Hessian cannot be stored explicitly. It can be shown that the conventional BFGS method accumulates approximate curvature in a sequence of expanding subspaces. This allows an approximate Hessian to be represented using a smaller reduced matrix that increases in dimension at each iteration. When the number of variables is large, this feature may be used to define limited-memory reduced-Hessian methods in which the dimension of the reduced Hessian is limited to save storage. Limited-memory reduced-Hessian methods have the benefit of requiring half the storage of conventional limited-memory methods. In this paper, we propose a particular reduced-Hessian method with substantial computational advantages compared to previous reduced-Hessian methods. Numerical results from a set of unconstrained problems in the CUTE test collection indicate that our implementation is competitive with the limited-memory codes L-BFGS and L-BFGS-B.
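To make the subspace representation concrete, the sketch below (with illustrative names; it is not the authors' algorithm, which maintains these quantities through carefully designed updates) computes a search direction when the approximate Hessian acts as a small matrix H_r on the subspace spanned by the orthonormal columns of Z, and as a positive multiple sigma of the identity on the orthogonal complement:

```python
import numpy as np

def reduced_hessian_direction(Z, H_r, g, sigma):
    """Quasi-Newton search direction for the model Hessian
    B = Z @ H_r @ Z.T + sigma * (I - Z @ Z.T),
    where Z (n x m, orthonormal columns) spans the subspace in which
    curvature has been accumulated and sigma > 0 scales the complement."""
    g_sub = Z.T @ g                       # gradient component inside the subspace
    g_perp = g - Z @ g_sub                # component orthogonal to the subspace
    p_sub = np.linalg.solve(H_r, g_sub)   # quasi-Newton step in reduced coordinates
    return -(Z @ p_sub + g_perp / sigma)  # solves B p = -g for this model
```

A limited-memory variant caps the number of columns of Z at some fixed m, so the storage is one n-by-m basis plus an m-by-m reduced matrix; this is the source of the halved storage relative to conventional limited-memory methods, which keep both vectors of each update pair.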

Similar resources

A limited memory adaptive trust-region approach for large-scale unconstrained optimization

This study is concerned with a trust-region method for solving unconstrained optimization problems. The approach takes advantage of the compact limited-memory BFGS updating formula together with an appropriate adaptive radius strategy. In our approach, the adaptive technique decreases the number of subproblems solved, while utilizing the structure of the limited-memory quasi-Newton ...

Full text
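As a rough illustration of the mechanism an adaptive radius strategy modifies, here is a generic trust-region radius update; the ratio test and the constants are conventional textbook choices, not the adaptive rule proposed in that paper:

```python
def update_trust_region(rho, delta, step_norm, eta1=0.25, eta2=0.75):
    """Generic radius update driven by rho, the ratio of the actual
    reduction in the objective to the reduction predicted by the
    quadratic model (values near 1 mean the model is trustworthy)."""
    if rho < eta1:                            # poor agreement: shrink the region
        return 0.25 * step_norm
    if rho > eta2 and step_norm >= 0.99 * delta:
        return 2.0 * delta                    # good agreement at the boundary: expand
    return delta                              # adequate agreement: keep the radius
```

An adaptive strategy replaces these fixed constants with quantities computed from problem information, which is how the number of subproblem solves is reduced.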

Criterion for the Limited Memory BFGS Algorithm for Large Scale Nonlinear Optimization

This paper studies recent modifications of the limited memory BFGS (L-BFGS) method for solving large scale unconstrained optimization problems. Each modification technique attempts to improve the quality of the L-BFGS Hessian by employing (extra) updates in a certain sense. Because at some iterations these updates might be redundant or worsen the quality of this Hessian, this paper proposes an update ...

Full text
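For orientation, the baseline that these extra-update modifications build on is the standard L-BFGS two-loop recursion, sketched below (a textbook sketch, not code from the paper); extra-update variants pass additional stored pairs through the same loops when a criterion judges them helpful:

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list, gamma):
    """Two-loop recursion: implicitly applies the inverse L-BFGS Hessian
    (built from the stored pairs (s_i, y_i), ordered oldest first) to the
    gradient g. gamma scales the initial inverse Hessian H0 = gamma * I."""
    q = g.astype(float).copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):  # newest pair first
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    r = gamma * q
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest first
        b = (y @ r) / (y @ s)
        r += (a - b) * s
    return -r                                 # search direction p = -H_k g
```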

Extra-Updates Criterion for the Limited Memory BFGS Algorithm for Large Scale Nonlinear Optimization

This paper studies recent modifications of the limited memory BFGS (L-BFGS) method for solving large scale unconstrained optimization problems. Each modification technique attempts to improve the quality of the L-BFGS Hessian by employing (extra) updates in a certain sense. Because at some iterations these updates might be redundant or worsen the quality of this Hessian, this paper proposes an ...

Full text

Dynamic scaling based preconditioning for truncated Newton methods in large scale unconstrained optimization

This paper deals with the preconditioning of truncated Newton methods for the solution of large scale nonlinear unconstrained optimization problems. We focus on preconditioners which can be naturally embedded in the framework of truncated Newton methods, i.e. which can be built without storing the Hessian matrix of the function to be minimized, but only based upon information on the Hessian obtained ...

Full text
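Such preconditioners plug into the inner conjugate-gradient solve of a truncated Newton method, which touches the Hessian only through products H @ v. Below is a minimal preconditioned-CG sketch under that assumption; hvp and M_inv are placeholder callables for the Hessian-vector product and the preconditioner, not functions from the paper:

```python
import numpy as np

def truncated_newton_step(grad, hvp, M_inv, tol=1e-2, max_iter=50):
    """Approximately solve H p = -grad by preconditioned CG, where
    hvp(v) returns H @ v and M_inv(r) applies the preconditioner.
    Iteration stops early on negative curvature or a small residual."""
    p = np.zeros_like(grad)
    r = -grad.copy()                  # residual of H p = -grad at p = 0
    z = M_inv(r)
    d = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Hd = hvp(d)
        dHd = d @ Hd
        if dHd <= 0.0:                # negative curvature: truncate
            break
        alpha = rz / dHd
        p += alpha * d
        r -= alpha * Hd
        if np.linalg.norm(r) <= tol * np.linalg.norm(grad):
            break                     # inexact Newton stopping test
        z = M_inv(r)
        rz_new = r @ z
        d = z + (rz_new / rz) * d
        rz = rz_new
    return p if p.any() else -grad    # fall back to steepest descent
```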

On the Behavior of Damped Quasi-Newton Methods for Unconstrained Optimization

We consider a family of damped quasi-Newton methods for solving unconstrained optimization problems. This family resembles that of Broyden with line searches, except that the change in gradients is replaced by a certain hybrid vector before updating the current Hessian approximation. This damped technique modifies the Hessian approximations so that they are maintained sufficiently positive definite ...

Full text
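The classical example of this kind of damping is Powell's modified BFGS update, which replaces the gradient change y by a convex combination of y and Bs so that positive definiteness survives the update. The sketch below shows that classical rule (with the conventional threshold 0.2), not the particular hybrid vector used by the family in that paper:

```python
import numpy as np

def damped_bfgs_update(B, s, y, phi=0.2):
    """Powell-damped BFGS update of the Hessian approximation B.
    The damped vector y_hat satisfies s @ y_hat >= phi * (s @ B @ s),
    so the updated matrix stays positive definite when B is."""
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    if sy >= phi * sBs:
        theta = 1.0                               # curvature is fine: plain BFGS
    else:
        theta = (1.0 - phi) * sBs / (sBs - sy)    # damp toward Bs
    y_hat = theta * y + (1.0 - theta) * Bs
    return (B - np.outer(Bs, Bs) / sBs
              + np.outer(y_hat, y_hat) / (s @ y_hat))
```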

Journal title:
  • SIAM Journal on Optimization

Volume 14, Issue -

Pages -

Publication date: 2003